Web Survey Bibliography
All social surveys suffer from different types of error, one of the most studied being non-response bias. Non-response bias is a systematic error that arises because individuals differ in their accessibility and in their propensity to participate in a survey, according to their own characteristics as well as those of the survey itself. The extent of the problem depends heavily on the correlation between the response mechanism and key survey variables. Non-response bias is nevertheless difficult to measure or correct for, because relevant data are lacking for the whole target population or sample. In this paper, non-response follow-up surveys are considered as a possible source of information about non-respondents. Non-response follow-ups, however, suffer from two methodological issues: they operate through a response mechanism of their own, which can itself introduce non-response bias, and they raise a problem of comparability of measurement, mostly because the survey design differs between the main survey and the non-response follow-up. To detect possible bias, the survey variables included in non-response follow-ups must be related to the mechanism of participation, yet not be sensitive to measurement effects arising from the different designs. Based on the accumulated experience of four similar non-response follow-ups, we studied the survey variables that fulfill these conditions. We distinguished socio-demographic variables, which are measurement-invariant but more weakly correlated with non-response, from attitudinal variables such as trust, social participation, or integration in the public sphere, which are more sensitive to measurement effects but potentially better suited to accounting for the non-response mechanism. Our results show that education level, work status, and living alone, as well as political interest, satisfaction with democracy, and trust in institutions, are pertinent variables to include in non-response follow-ups of general social surveys.
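The abstract's claim that the extent of the bias depends on the correlation between the response mechanism and key survey variables can be made concrete with a standard approximation under a random response-propensity model (a textbook sketch, not part of the original abstract):

\[
\mathrm{Bias}(\bar{y}_R) \;\approx\; \frac{\mathrm{Cov}(\rho, y)}{\bar{\rho}} \;=\; \frac{R_{\rho y}\, S_{\rho}\, S_{y}}{\bar{\rho}}
\]

where \(\rho_i\) is the response propensity of unit \(i\), \(y_i\) the survey variable, \(\bar{\rho}\) the mean propensity, \(R_{\rho y}\) the correlation between propensity and the variable, and \(S_{\rho}\), \(S_{y}\) their standard deviations. The bias of the respondent mean \(\bar{y}_R\) thus vanishes when propensity and the survey variable are uncorrelated, and grows with both the strength of that correlation and the spread of the propensities.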
Web survey bibliography - 2015 (291)
- Effects of Mobile versus PC Web on Survey Response Quality: a Crossover Experiment in a Probability...; 2017; Antoun, C.; Couper, M. P.; Conrad, F. G.
- When will Nonprobability Surveys Mirror Probability Surveys? Considering Types of Inference and Weighting...; 2016; Pasek, J.
- Distractions: The Incidence and Consequences of Interruptions for Survey Respondents; 2016; Ansolabehere, S.; Schaffner, B. F.
- The Effect of CATI Questions, Respondents, and Interviewers on Response Time; 2016; Olson, K.; Smyth, J. D.
- Linearization Variance Estimators for Mixed-mode Survey Data when Response Indicators are Modeled...; 2016; Demnati, A.
- Adaptive survey designs to minimize survey mode effects – a case study on the Dutch Labor Force...; 2016; Calinescu, M.; Schouten, B.
- What is the gain in a probability-based online panel to provide Internet access to sampling units that...; 2016; Revilla, M.; Cornilleau, A.; Cousteaux, A.-S.; Legleye, S.; de Pedraza, P.
- Representative web-survey!; 2016; Linde, P.
- Assessing targeted approach letters: effects in different modes on response rates, response speed and...; 2016; Lynn, P.
- New Generation of Online Questionnaires?; 2016; Revilla, M.; Ochoa, C.; Turbina, A.
- The Analysis of Respondent’s Behavior toward Edit Messages in a Web Survey; 2016; Park, Y.
- Refining the Web Response Option in the Multiple Mode Collection of the American Community Survey; 2016; Hughes, T.; Tancreto, J.
- The Utility of an Online Convenience Panel for Reaching Rare and Dispersed Populations; 2016; Sell, R.; Goldberg, S.; Conron, K.
- Setting Up an Online Panel Representative of the General Population: The German Internet Panel; 2016; Blom, A. G.; Gathmann, C.; Krieger, U.
- Implementation of Web-Based Respondent Driven Sampling among Men Who Have Sex with Men in Sweden; 2016; Stroemdahl, S.; Lu, X.; Bengtsson, L.; Liljeros, F.; Thorson, A.
- Recommended Practices for the design of business surveys questionnaires; 2016; Macchia, S.
- Web-based versus Paper-based Survey Data: An Estimation of Road Users’ Value of Travel Time Savings...; 2016; Kato, H.; Sakashita, A.; Tsuchiya, Tak.
- Reminder Effect and Data Usability on Web Questionnaire Survey for University Students; 2016; Oishi, T.; Mori, M.; Takata, E.
- Feasibility of using a multilingual web survey in studying the health of ethnic minority youth.; 2016; Kinnunen, J. M.; Malin, M.; Raisamo, S. U.; Lindfors, P. L.; Pere, L. A.; Rimpelae, A. H.
- Respondents of a follow-up web-based survey; 2016; Stoddard, S. A.; Amparo, P.; Popick, H.; Yudd, R.; Sujeer, A.; Baath, M.
- Is One More Reminder Worth It? If So, Pick Up the Phone: Findings from a Web Survey; 2016; Lin-Freeman, L.
- Reducing Underreports of Behaviors in Retrospective Surveys: The Effects of Three Different Strategies...; 2016; Lugtig, P. J.; Glasner, T.; Boeve, A.
- What drives the participation in a monthly research web panel? The experience of ELIPSS, a French random...; 2016; Legleye, S.; Cornilleau, A.; Razakamanana, N.
- When Should I Call You? An Analysis of Differences in Demographics and Responses According to Respondents...; 2016; Vicente, P.; Lopes, I.
- The use and positioning of clarification features in web surveys; 2016; Metzler, A.; Kunz, T.; Fuchs, M.
- Online Surveys are Mixed-Device Surveys. Issues Associated with the Use of Different (Mobile) Devices...; 2016; Toepoel, V.; Lugtig, P. J.
- Mail merge can be used to create personalized questionnaires in complex surveys; 2016; Taljaard, M.; Chaudhry, S. H.; Brehaut, J. C.; Weijer, C.; Grimshaw, J. M.
- Electronic and paper based data collection methods in library and information science research: A comparative...; 2016; Tella, A.
- Stable Relationships, Stable Participation? The Effects of Partnership Dissolution and Changes in Relationship...; 2016; Mueller, B.; Castiglioni, L.
- Identifying Pertinent Variables for Nonresponse Follow-Up Surveys. Lessons Learned from 4 Cases in Switzerland...; 2016; Vandenplas, C.; Joye, D.; Staehli, M. E.; Pollien, A.
- The 2013 Census Test: Piloting Methods to Reduce 2020 Census Costs; 2016; Walejko, G. K.; Miller, P. V.
- Methods can matter: Where Web surveys produce different results than phone interviews; 2016; Keeter, S.
- Sunday shopping – The case of three surveys; 2016; Bethlehem, J.
- Will They Stay or Will They Go? Personality Predictors of Dropout in Online Study; 2016; Nestler, S.; Thielsch, M.; Vasilev, E.; Back, M.
- HUFFPOLLSTER: Why Reaching Latinos Is A Challenge For Pollsters; 2016; Jackson, N. M.; Edwards-Levy, A.; Velencia, J.
- Comprehension and engagement in survey interviews with virtual agents; 2016; Conrad, F. G.; Schober, M. F.; Jans, M.; Orlowski, R. A.; Nielsen, D.; Levenstein, R. M.
- Revisiting “yes/no” versus “check all that apply”: Results from a mixed modes...; 2016; Nicolaas, G.; Campanelli, P.; Hope, S.; Jaeckle, A.; Lynn, P.
- Moderators of Candidate Name-Order Effects in Elections: An Experiment; 2016; Kim, Nu.; Krosnick, J. A.; Casasanto, D.
- Predictive inference for non-probability samples: a simulation study; 2016; Buelens, B.; Burger, J.; van den Brakel, J.
- Equivalence of paper-and-pencil and computerized self-report surveys in older adults; 2016; Weigold, A.; Weigold, I. K.; Drakeford, M. K.; Dykema, S. A.; Smith, C. A.
- Quality of Different Scales in an Online Survey in Mexico and Colombia; 2016; Revilla, M.; Ochoa, C.
- Swapping bricks for clicks: Crowdsourcing longitudinal data on Amazon Turk; 2016; Daly, T. M.; Nataraajan, R.
- A reliability analysis of Mechanical Turk data; 2016; Rouse, S. V.
- Quota Controls in Survey Research; 2016; Gittelman, S. H.; Thomas, R. K.; Lavrakas, P. J.; Lange, V.
- Computers, Tablets, and Smart Phones: The Truth About Web-based Surveys; 2016; Merle, P.; Gearhart, S.; Craig, C.; Vandyke, M.; Brooks, M. E.; Rahimi, M.
- Scientific Surveys Based on Incomplete Sampling Frames and High Rates of Nonresponse; 2016; Fahimi, M.; Barlas, F. M.; Thomas, R. K.; Buttermore, N. R.
- Taming Big Data: Using App Technology to Study Organizational Behavior on Social Media; 2015; Bail, C. A.
- The Use of a Nonprobability Internet Panel to Monitor Sexual and Reproductive Health in the General...; 2015; Legleye, S.; Charrance, G.; Razafindratsima, N.; Bajos, N.; Bohet, A.; Moreau, C.
- Adapting Labour Force Survey questions from interviewer-administered modes for web self-completion in...; 2015; Betts, P.; Cubbon, B.
- ESOMAR/GRBN Online Research Guideline; 2015